Square Unit Augmented, Radially Extended, Multilayer Perceptrons
Author
Abstract
Consider a multilayer perceptron (MLP) with d inputs, a single hidden sigmoidal layer and a linear output. By adding an additional d inputs to the network with values set to the square of the first d inputs, properties reminiscent of higher-order neural networks and radial basis function networks (RBFN) are added to the architecture with little added expense in terms of weight requirements. Of particular interest, this architecture has the ability to form localized features in a d-dimensional space with a single hidden node but can also span large volumes of the input space; thus, the architecture has the localized properties of an RBFN but does not suffer as badly from the curse of dimensionality. I refer to a network of this type as a SQuare Unit Augmented, Radially Extended, MultiLayer Perceptron (SQUARE-MLP or SMLP).
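To illustrate the localization property described above, the following is a minimal NumPy sketch (not taken from the paper; the weight values and the helper name `smlp_hidden_unit` are chosen for illustration). It shows that a single sigmoidal node receiving both x and x² can realize an RBF-like bump: choosing the square-input weights v_i = -a and the linear weights w_i = 2·a·c_i makes the pre-activation equal a·(r² − ||x − c||²), so the node fires only near the centre c.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def smlp_hidden_unit(x, w, v, b):
    """One SMLP hidden node: a sigmoid applied to a weighted sum of the
    original inputs x and their squares x**2 (the augmented inputs)."""
    return sigmoid(x @ w + (x ** 2) @ v + b)

# With v = -a, w = 2*a*c and b = a*(r**2 - c.c), the pre-activation is
# a * (r**2 - ||x - c||**2): positive inside a sphere of radius r around
# c and increasingly negative outside -- a localized, RBF-like feature
# built from a single sigmoidal unit.
a, r = 4.0, 1.0
c = np.array([0.5, -0.5])          # centre of the localized feature
v = -a * np.ones(2)                # weights on the squared inputs
w = 2 * a * c                      # weights on the original inputs
b = a * (r ** 2 - c @ c)           # bias completing the square

inside = smlp_hidden_unit(c, w, v, b)                    # near 1 at the centre
outside = smlp_hidden_unit(c + np.array([3.0, 0.0]), w, v, b)  # near 0 far away
```

The same node, with v set to zero, degenerates to an ordinary sigmoidal half-space response, which is why the architecture can also span large volumes of the input space.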
Similar resources
Estimating the Number of Components in a Mixture of Multilayer Perceptrons
The BIC criterion is widely used by the neural-network community for model selection tasks, although its convergence properties are not always theoretically established. In this paper we focus on estimating the number of components in a mixture of multilayer perceptrons and proving the convergence of the BIC criterion in this framework. The penalized marginal-likelihood for mixture models and hidd...
Training Multilayer Perceptrons with the Extended Kalman Algorithm
A large fraction of recent work in artificial neural nets uses multilayer perceptrons trained with the back-propagation algorithm described by Rumelhart et al. This algorithm converges slowly for large or complex problems such as speech recognition, where thousands of iterations may be needed for convergence even with small data sets. In this paper, we show that training multilayer perceptrons...
Multilayer Perceptrons with Radial Basis Functions as Value Functions in Reinforcement Learning
Using multilayer perceptrons (MLPs) to approximate the state-action value function in reinforcement learning (RL) algorithms could become a nightmare due to the constant possibility of unlearning past experiences. Moreover, since the target values in the training examples are bootstrapped values, that is, estimates of other estimates, the chances of getting stuck in a local minimum are increased. The...
Recognition of Handwritten Digits Using Multilayer Perceptrons
Neural networks are often used for pattern recognition. They prove to be a popular choice for OCR (Optical Character Recognition) systems, especially when dealing with the recognition of printed text. In this paper, multilayer perceptrons are used for the recognition of handwritten digits. The accuracy achieved proves that this application is a working prototype that can be further extended int...
Efficient High-precision Boilerplate Detection Using Multilayer Perceptrons
Removal of boilerplate is among the essential tasks in web corpus construction and web indexing. In this paper, we present an improved machine learning approach to general-purpose boilerplate detection for languages based on (extended) Latin alphabets (easily adaptable to other scripts). We keep it highly efficient (around 320 documents per second per CPU core) by using an optimized Multilay...